Coordinating different sensory inputs during development. Focus on "Early experience determines how the senses will interact".

Author

  • Andrew J King
Abstract

Objects and events encountered in everyday life frequently generate cues that are registered by the sense organs of more than one modality. For instance, it is often the case when listening to someone’s voice that we also see their lips moving. The capacity of the brain to combine and coordinate the different sensory signals arising from a common source provides us with a unified perception of the world and is essential for directing attention and controlling movement within it. In this issue of the Journal of Neurophysiology (p. 921–926), Wallace and Stein show that experience during infancy can shape the way in which visual and auditory inputs interact to determine the responses of neurons in the superior colliculus (SC), a midbrain nucleus involved in the control of orienting movements.

Interactions between the senses can improve the likelihood of detecting and responding to an event and of identifying and localizing it accurately. On the other hand, if incongruent information is provided by different sensory modalities, then our perception of the event in question can be degraded or altered. This is well illustrated in humans by the “McGurk effect.” Although speech comprehension can be improved, particularly in a noisy environment, by lip-reading, watching a person articulate one speech syllable while listening to another typically results in the perception of a third sound that represents a combination of what was seen and heard (McGurk and MacDonald 1976). The perceptual consequences of crossmodal interactions therefore depend on the binding together of appropriate multisensory signals, i.e., those originating from the source in question, as opposed to other, unrelated stimuli.

This capacity to combine information across different sensory modalities to form a coherent multisensory representation of the world has its origin in the way in which different sensory systems interact during development. The importance of experience in this process has been demonstrated at a number of levels (Lickliter and Bahrick 2004), particularly in the matching of spatial information provided by the different sensory systems. Recent studies have shown that multisensory convergence is more widespread in the brain than was previously thought (Ghazanfar and Schroeder 2006), but the SC has long been the region of choice for investigating the way in which the spatial cues provided by different sensory modalities are combined and integrated by individual neurons, both in adult animals and during the course of development (King 1999).

There are two related reasons for this. First, for each sensory modality, stimulus location is represented topographically in the SC to form overlapping maps of space. In principle, this allows the different sensory cues associated with a common source to activate a specific region of the SC motor map and therefore be transformed into motor commands that result in a change in gaze direction. Second, many of the neurons found in the deeper layers of this midbrain structure receive converging inputs from two or more sensory systems and generate higher spike discharge rates, and in all likelihood more accurate orienting responses, when combinations of stimuli are delivered in close temporal and spatial proximity. Because spatial information is represented in different reference frames in the visual, auditory, and somatosensory systems, maintenance of intersensory map alignment in the SC requires that these signals are transformed into a common set of coordinates.
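To make the idea concrete, the simplest possible version of such a transformation considers azimuth alone and subtracts the current eye-in-head position from a sound's head-centered direction. The Python sketch below illustrates only this caricature; the function name and the purely subtractive, one-dimensional rule are illustrative assumptions, not a description of the actual collicular computation.

# A caricature of remapping a head-centered auditory direction into
# eye-centered coordinates (azimuth only). Real collicular signals are
# remapped only partially and mix several reference frames, so this purely
# subtractive rule is a deliberate simplification.

def head_to_eye_centered(sound_azimuth_deg: float, eye_position_deg: float) -> float:
    """Return the sound's direction relative to the current line of gaze.

    sound_azimuth_deg: source azimuth in head-centered coordinates (deg).
    eye_position_deg:  horizontal eye-in-head position (deg, positive = rightward).
    """
    return sound_azimuth_deg - eye_position_deg

# A sound 20 deg to the right of the head, heard while the eyes are deviated
# 15 deg rightward, lies only 5 deg from the line of gaze, i.e., in the same
# coordinates in which the visual map registers the event.
print(head_to_eye_centered(20.0, 15.0))  # prints 5.0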
Auditory and somatosensory receptive fields are indeed partially remapped into eye-centered coordinates, but this transformation appears to be incomplete, suggesting that multiple reference frames are employed (Pouget et al. 2002). The process of aligning the representations of the different modalities also depends on interactions that take place between the sensory inputs to the SC, particularly during development but also, to a certain extent, in later life. It has been shown, for example, that shifting the visual representation relative to the head by optical (Bergan et al. 2005; Knudsen and Brainard 1991) or surgical (King et al. 1988) means can produce a corresponding shift in the auditory spatial tuning of SC neurons. This guiding role for vision is further supported by the finding that degradation of visual input during infancy results both in the emergence of auditory and somatosensory receptive fields that are either abnormally large (Wallace et al. 2004; Withington-Wray et al. 1990) or inappropriately located (King and Carlile 1993; Knudsen et al. 1991) and in an absence of multisensory facilitation in the responses of SC neurons (Wallace et al. 2004).

Now Wallace and Stein (2006) have extended these findings by maintaining kittens in the dark and periodically exposing them to temporally coincident but spatially incongruent visual and auditory stimuli, to determine whether a systematic change in the spatial relationship of these cues could alter the way in which they are synthesized within the brain. Recordings made when the animals were mature revealed that some neurons had abnormally large receptive fields and failed to show crossmodal interactions when visual and auditory stimuli were presented together, in accord with the previously reported effects of dark rearing alone. By contrast, other neurons had relatively small visual and auditory receptive fields, which, as a result of a systematic displacement in auditory spatial tuning, showed little overlap. As in normally raised controls, multisensory enhancement could be elicited in these neurons when visual and auditory stimuli were presented together from within their respective receptive fields. Because of the misalignment of those receptive fields, however, this required the stimuli to be presented from different locations.
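Crossmodal interactions of this kind are conventionally quantified as the percentage change of the response to the combined stimulus relative to the most effective single-modality response, with values above zero indicating multisensory enhancement. A minimal Python sketch of that index follows; the function name and the spike counts in the example are invented for illustration.

# Conventional index of multisensory enhancement: the percentage change of
# the combined-stimulus response relative to the best unisensory response.
# Positive values indicate enhancement; negative values indicate depression.

def multisensory_enhancement(visual: float, auditory: float, combined: float) -> float:
    """Return percent enhancement, given mean responses in spikes per trial."""
    best_unisensory = max(visual, auditory)
    return 100.0 * (combined - best_unisensory) / best_unisensory

# Invented example: a neuron firing 4 spikes/trial to the visual stimulus
# alone, 3 to the auditory stimulus alone, and 9 when the two stimuli fall
# within its (possibly misaligned) receptive fields shows 125% enhancement.
print(multisensory_enhancement(visual=4.0, auditory=3.0, combined=9.0))  # prints 125.0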

Similar articles

Enriched and Deprived Sensory Experience Induces Structural Changes and Rewires Connectivity during the Postnatal Development of the Brain

During postnatal development, sensory experience modulates cortical development, inducing numerous changes in all of the components of the cortex. Most of the cortical changes thus induced occur during the critical period, when the functional and structural properties of cortical neurons are particularly susceptible to alterations. Although the time course for experience-mediated sensory develo...

Early experience determines how the senses will interact.

Multisensory integration refers to the process by which the brain synthesizes information from different senses to enhance sensitivity to external events. In the present experiments, animals were reared in an altered sensory environment in which visual and auditory stimuli were temporally coupled but originated from different locations. Neurons in the superior colliculus developed a seemingly a...

Multimodal Contributions to Body Representation

Our body is a unique entity by which we interact with the external world. Consequently, the way we represent our body has profound implications in the way we process and locate sensations and in turn perform appropriate actions. The body can be the subject, but also the object of our experience, providing information from sensations on the body surface and viscera, but also knowledge of the bod...

Cross-modal interaction between vision and touch: the role of synesthetic correspondence.

At each moment, we experience a melange of information arriving at several senses, and often we focus on inputs from one modality and 'reject' inputs from another. Does input from a rejected sensory modality modulate one's ability to make decisions about information from a selected one? When the modalities are vision and hearing, the answer is "yes", suggesting that vision and hearing interact....

The Application of Tactile Experience in Urban Perception

Urban perception is the result of mutual transaction between human and environment and the process of perception is developed through the three continuous steps of “sensation”, “perception” and “cognition”. In the first step (sensorial perception), the environmental signals are received via sensorial sensors and each different sense based on its own essence, performance and ability has its own ...


Journal:
  • Journal of Neurophysiology

Volume 97, Issue 1

Pages 3–4

Published 2007 (first published October 25, 2006; doi:10.1152/jn.01075.2006)

Correspondence: Dept. of Physiology, Anatomy and Genetics, Sherrington Bldg., Parks Road, University of Oxford, Oxford OX1 3PT, UK (E-mail: [email protected]).